Section: New Results

Estimator selection and statistical tests

Participants: Sylvain Arlot, Matthieu Lerasle.

G. Maillard, S. Arlot and M. Lerasle studied a method that mixes cross-validation with aggregation, called aggregated hold-out (Agghoo), which is already used by several practitioners. Agghoo can also be related to bagging. Numerical experiments show that Agghoo can significantly improve on the prediction error of cross-validation at the same computational cost, which makes it very promising as a general-purpose tool for prediction. This work provides the first theoretical guarantees on Agghoo, in the supervised classification setting, ensuring that it can be used safely: at worst, Agghoo performs like hold-out, up to a constant factor. A non-asymptotic oracle inequality is also proved, in binary classification under the margin condition, which is sharp enough to yield (fast) minimax rates.
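
For illustration only, and not the authors' implementation, the following minimal sketch shows the Agghoo idea for binary classification with scikit-learn-style candidate estimators (labels in {0, 1}): repeat hold-out selection over several random splits of the data, then aggregate the selected classifiers by majority vote; the split ratio, number of splits and candidate list are assumptions of the sketch.

import numpy as np
from sklearn.base import clone
from sklearn.model_selection import train_test_split

def agghoo_classify(candidates, X, y, X_new, n_splits=10, train_size=0.8, seed=0):
    """Aggregated hold-out (illustrative sketch): for each random split, keep the
    candidate with the smallest hold-out misclassification error, then aggregate
    the selected classifiers by majority vote on the new points."""
    rng = np.random.RandomState(seed)
    votes = np.zeros(len(X_new))
    for _ in range(n_splits):
        X_tr, X_val, y_tr, y_val = train_test_split(
            X, y, train_size=train_size, random_state=rng.randint(2**31 - 1))
        # Hold-out step: estimate each candidate's risk on the validation part.
        fitted = [clone(c).fit(X_tr, y_tr) for c in candidates]
        errors = [np.mean(f.predict(X_val) != y_val) for f in fitted]
        best = fitted[int(np.argmin(errors))]
        votes += best.predict(X_new)
    # Aggregation step: majority vote over the hold-out-selected classifiers.
    return (votes >= n_splits / 2).astype(int)

With n_splits = 1 this reduces to the usual hold-out, which is consistent with the guarantee stated above that Agghoo is at worst comparable to hold-out.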

With G. Lecué, Matthieu Lerasle worked on “learning from MOM's principles”, showing that a recent procedure of Lugosi and Mendelson can be recovered by applying Le Cam's “estimation from tests” approach to MOM (median-of-means) tests. They also established robustness properties of these estimators, proving that their rates of convergence are not downgraded even when part of the dataset is corrupted by “outliers”, while the remaining data are only required to have the same first and second moments as the target probability distribution.
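
As background, the median-of-means principle underlying these tests can be illustrated by the classical MOM estimator of a mean: split the sample into blocks, average each block, and take the median of the block means, so that a minority of corrupted blocks cannot move the estimate. The sketch below is generic NumPy code under these assumptions, not the procedure analysed in the paper.

import numpy as np

def median_of_means(x, n_blocks=11, seed=0):
    """Median-of-means estimator of E[X] (illustrative sketch): shuffle the sample,
    split it into n_blocks roughly equal blocks, average each block, and return
    the median of the block means."""
    rng = np.random.RandomState(seed)
    x = rng.permutation(np.asarray(x, dtype=float))
    blocks = np.array_split(x, n_blocks)
    return np.median([b.mean() for b in blocks])

# Toy check: a few gross outliers barely affect the MOM estimate,
# while the empirical mean is badly corrupted.
sample = np.concatenate([np.random.randn(1000), [1e6] * 5])
print(sample.mean(), median_of_means(sample))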